[LLM] Reconstruct fused transformer layers #7186
Conversation
Thanks for your contribution!
Codecov Report

@@            Coverage Diff             @@
##           develop    #7186      +/-   ##
===========================================
- Coverage    59.39%   59.33%    -0.06%
===========================================
  Files          567      567
  Lines        83114    83190       +76
===========================================
  Hits         49364    49364
- Misses       33750    33826       +76
This PR caused the existing inference unit tests to fail: https://xly.bce.baidu.com/paddlepaddle/Paddle-NLP/newipipe/detail/9304392/job/24091206. Given that, it may affect the accuracy of previously supported models.
After #7187 is merged, your branch will have conflicts; please resolve them.
LGTM
PR types
Function optimization
PR changes
Others
Description
Reconstruct the FusedMultiTransformer class.
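The description above is terse, so the following is only a minimal sketch of one common way such a layer is restructured: grouping the many per-layer constructor arguments of FusedMultiTransformer into a configuration object plus a base class that variants can extend. The names FusedMultiTransformerConfig and FusedMultiTransformerBase, and every field and shape shown here, are assumptions made for illustration and are not taken from the PR diff itself.

```python
# Hypothetical sketch only: the config/base-class split and all names below
# are illustrative assumptions, not the PR's actual interface.
from dataclasses import dataclass

import paddle
from paddle import nn


@dataclass
class FusedMultiTransformerConfig:
    """Collects the fused layer's constructor arguments in one place."""
    embed_dim: int
    num_heads: int
    dim_feedforward: int
    num_layers: int = 1
    epsilon: float = 1e-5


class FusedMultiTransformerBase(nn.Layer):
    """Holds per-layer weights; subclasses can change how they are created or used."""

    def __init__(self, config: FusedMultiTransformerConfig):
        super().__init__()
        self.config = config
        self.qkv_weights = []
        self.out_proj_weights = []
        for _ in range(config.num_layers):
            # One fused QKV projection and one output projection per layer.
            qkv = self.create_parameter(
                shape=[config.embed_dim, 3 * config.embed_dim],
                dtype=self._dtype,
            )
            out_proj = self.create_parameter(
                shape=[config.embed_dim, config.embed_dim],
                dtype=self._dtype,
            )
            self.qkv_weights.append(qkv)
            self.out_proj_weights.append(out_proj)

    def forward(self, hidden_states):
        # Placeholder forward pass: apply the projections layer by layer
        # (multi-head splitting, bias, and normalization are omitted).
        head_dim = self.config.embed_dim // self.config.num_heads
        for qkv, out_proj in zip(self.qkv_weights, self.out_proj_weights):
            q, k, v = paddle.split(paddle.matmul(hidden_states, qkv), 3, axis=-1)
            attn = paddle.nn.functional.softmax(
                paddle.matmul(q, k, transpose_y=True) / head_dim**0.5
            )
            hidden_states = paddle.matmul(paddle.matmul(attn, v), out_proj)
        return hidden_states
```

Under these assumptions, a caller would build the layer as FusedMultiTransformerBase(FusedMultiTransformerConfig(embed_dim=1024, num_heads=16, dim_feedforward=4096, num_layers=2)); the point of the config object is that adding a new option no longer changes every constructor signature.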